{ "cells": [ { "cell_type": "markdown", "id": "10108f95", "metadata": {}, "source": [ "# はじめに" ] }, { "cell_type": "markdown", "id": "fb70e585", "metadata": {}, "source": [ "## 最良買い気配値と最良売り気配値の表示" ] }, { "cell_type": "code", "execution_count": 1, "id": "0f0e1b5a", "metadata": {}, "outputs": [], "source": [ "from numba import njit\n", "\n", "import numpy as np\n", "\n", "# numba.njit is strongly recommended for fast backtesting.\n", "@njit\n", "def print_bbo(hbt):\n", " # Iterating until hftbacktest reaches the end of data.\n", " # Elapses 60-sec every iteration.\n", " # Time unit is the same as data's timestamp's unit.\n", " # Timestamp of the sample data is in nanoseconds.\n", " while hbt.elapse(60 * 1e9) == 0: \n", " # Gets the market depth for the first asset.\n", " depth = hbt.depth(0)\n", "\n", " # Prints the best bid and the best offer.\n", " print(\n", " 'current_timestamp:', hbt.current_timestamp,\n", " ', best_bid:', np.round(depth.best_bid, 1),\n", " ', best_ask:', np.round(depth.best_ask, 1)\n", " )\n", " return True" ] }, { "cell_type": "code", "execution_count": 2, "id": "d9e717ec", "metadata": {}, "outputs": [], "source": [ "from hftbacktest import BacktestAsset, HashMapMarketDepthBacktest\n", "\n", "asset = (\n", " BacktestAsset()\n", " # Sets the data to feed for this asset.\n", " #\n", " # Due to the vast size of tick-by-tick market depth and trade data,\n", " # loading the entire dataset into memory can be challenging,\n", " # particularly when backtesting across multiple days.\n", " # HftBacktest offers lazy loading support and is compatible with npy and preferably npz.\n", " #\n", " # For details on the normalized feed data, refer to the following documents.\n", " # * https://hftbacktest.readthedocs.io/en/latest/data.html \n", " # * https://hftbacktest.readthedocs.io/en/latest/tutorials/Data%20Preparation.html\n", " .data(['usdm/btcusdt_20240809.npz'])\n", " # Sets the initial snapshot (optional).\n", " .initial_snapshot('usdm/btcusdt_20240808_eod.npz')\n", " # Asset type:\n", " # * Linear\n", " # * Inverse.\n", " # 1.0 represents the contract size, which is the value of the asset per quoted price.\n", " .linear_asset(1.0) \n", " # HftBacktest provides two built-in latency models.\n", " # * constant_latency\n", " # * intp_order_latency\n", " # To implement your own latency model, please use Rust.\n", " # \n", " # Time unit is the same as data's timestamp's unit. Timestamp of the sample data is in nanoseconds.\n", " # Sets the order entry latency and response latency to 10ms.\n", " .constant_latency(10_000_000, 10_000_000)\n", " # HftBacktest provides several types of built-in queue position models.\n", " # Please find the details in the documents below.\n", " # https://hftbacktest.readthedocs.io/en/latest/tutorials/Probability%20Queue%20Models.html\n", " #\n", " # To implement your own queue position model, please use Rust.\n", " .risk_adverse_queue_model() \n", " # HftBacktest provides two built-in exchange models.\n", " # * no_partial_fill_exchange\n", " # * partial_fill_exchange\n", " # To implement your own exchange model, please use Rust.\n", " .no_partial_fill_exchange()\n", " # HftBacktest provides several built-in fee models.\n", " # * trading_value_fee_model\n", " # * trading_qty_fee_model\n", " # * flat_per_trade_fee_model\n", " #\n", " # 0.02% maker fee and 0.07% taker fee. 
If the fee is negative, it represents a rebate.\n", "    # For example, -0.00005 represents a 0.005% rebate for the maker order.\n", "    .trading_value_fee_model(0.0002, 0.0007)\n", "    # Tick size of this asset: minimum price increment.\n", "    .tick_size(0.1)\n", "    # Lot size of this asset: minimum trading unit.\n", "    .lot_size(0.001)\n", "    # Sets the capacity of the vector that stores trades occurring in the market.\n", "    # If you set the size, you need to call `clear_last_trades` to clear the vector.\n", "    # A value of 0 indicates that no market trades are stored. (Default)\n", "    .last_trades_capacity(0)\n", ")\n", "\n", "# HftBacktest provides several types of built-in market depth implementations.\n", "# HashMapMarketDepthBacktest constructs a Backtest using a HashMap-based market depth implementation.\n", "# Another useful implementation is ROIVectorMarketDepth, which is utilized in ROIVectorMarketDepthBacktest.\n", "# Please find the details in the document below.\n", "hbt = HashMapMarketDepthBacktest([asset])" ] }, { "cell_type": "markdown", "id": "0d4ecb19", "metadata": {}, "source": [ "You can check the best bid and best ask every 60 seconds. Since prices are 32-bit floating-point numbers, floating-point errors may occur, so take care when using them. In the examples, prices are rounded based on the tick size for readability." ] }, { "cell_type": "code", "execution_count": 3, "id": "128fb48a", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "current_timestamp: 1723161661500000000 , best_bid: 61594.1 , best_ask: 61594.2\n", "current_timestamp: 1723161721500000000 , best_bid: 61576.5 , best_ask: 61576.6\n", "current_timestamp: 1723161781500000000 , best_bid: 61629.6 , best_ask: 61629.7\n", "current_timestamp: 1723161841500000000 , best_bid: 61621.5 , best_ask: 61621.6\n", "current_timestamp: 1723161901500000000 , best_bid: 61583.9 , best_ask: 61584.0\n" ] }, { "data": { "text/plain": [ "True" ] }, "execution_count": 3, "metadata": {}, "output_type": "execute_result" } ], "source": [ "print_bbo(hbt)" ] }, { "cell_type": "markdown", "id": "5b316f2f-433c-423e-83a3-95451f4951e8", "metadata": {}, "source": [ "A HftBacktest instance cannot be reused. Therefore, make sure to close the backtest after you are done with it. Using the backtest after closing it will crash." ] }, { "cell_type": "code", "execution_count": 4, "id": "7a2be482-1796-4aa3-8781-8fc991e93064", "metadata": {}, "outputs": [], "source": [ "_ = hbt.close()" ] }, { "cell_type": "markdown", "id": "7754e06a", "metadata": {}, "source": [ "## Feeding data" ] }, { "cell_type": "markdown", "id": "b0003e60", "metadata": {}, "source": [ "If you have enough memory, you can preload the data into memory and feed it in, which is more efficient than lazily loading it during repeated backtests." ] }, { "cell_type": "code", "execution_count": 5, "id": "7dca08d3-4434-4410-9952-8b20ffe07670", "metadata": {}, "outputs": [], "source": [ "btcusdt_20240809 = np.load('usdm/btcusdt_20240809.npz')['data']\n", "btcusdt_20240808_eod = np.load('usdm/btcusdt_20240808_eod.npz')['data']\n", "\n", "asset = (\n", "    BacktestAsset()\n", "    .data([btcusdt_20240809])\n", "    .initial_snapshot(btcusdt_20240808_eod)\n", "    .linear_asset(1.0)\n", "    .constant_latency(10_000_000, 10_000_000)\n", "    .risk_adverse_queue_model()\n", "    .no_partial_fill_exchange()\n", "    .trading_value_fee_model(0.0002, 0.0007)\n", "    .tick_size(0.1)\n", "    .lot_size(0.001)\n", ")" ] }, { "cell_type": "code", "execution_count": 6, "id": "d4b907f3-fc83-4ca9-8c95-7475f2bc0e3c", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "current_timestamp: 1723161661500000000 , best_bid: 61594.1 , best_ask: 61594.2\n", "current_timestamp: 1723161721500000000 , best_bid: 61576.5 , best_ask: 61576.6\n", "current_timestamp: 
1723161781500000000 , best_bid: 61629.6 , best_ask: 61629.7\n", "current_timestamp: 1723161841500000000 , best_bid: 61621.5 , best_ask: 61621.6\n", "current_timestamp: 1723161901500000000 , best_bid: 61583.9 , best_ask: 61584.0\n" ] } ], "source": [ "hbt = HashMapMarketDepthBacktest([asset])\n", "\n", "print_bbo(hbt)\n", "\n", "_ = hbt.close()" ] }, { "cell_type": "markdown", "id": "6ec3131e", "metadata": {}, "source": [ "## 市場深度の取得" ] }, { "cell_type": "code", "execution_count": 7, "id": "467b649d", "metadata": {}, "outputs": [], "source": [ "@njit\n", "def print_3depth(hbt):\n", " while hbt.elapse(60 * 1e9) == 0:\n", " print('current_timestamp:', hbt.current_timestamp)\n", "\n", " # Gets the market depth for the first asset, in the same order as when you created the backtest.\n", " depth = hbt.depth(0)\n", "\n", " # a key of bid_depth or ask_depth is price in ticks.\n", " # (integer) price_tick = price / tick_size\n", " i = 0\n", " for tick_price in range(depth.best_ask_tick, depth.best_ask_tick + 100):\n", " qty = depth.ask_qty_at_tick(tick_price)\n", " if qty > 0:\n", " print(\n", " 'ask: ',\n", " qty,\n", " '@',\n", " np.round(tick_price * depth.tick_size, 1)\n", " )\n", " \n", " i += 1\n", " if i == 3:\n", " break\n", " i = 0\n", " for tick_price in range(depth.best_bid_tick, max(depth.best_bid_tick - 100, 0), -1):\n", " qty = depth.bid_qty_at_tick(tick_price)\n", " if qty > 0:\n", " print(\n", " 'bid: ',\n", " qty,\n", " '@',\n", " np.round(tick_price * depth.tick_size, 1)\n", " )\n", " \n", " i += 1\n", " if i == 3:\n", " break\n", " return True" ] }, { "cell_type": "code", "execution_count": 8, "id": "74311eb7-ecb5-4c61-b677-f8373a9169d4", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "current_timestamp: 1723161661500000000\n", "ask: 1.759 @ 61594.2\n", "ask: 0.006 @ 61594.4\n", "ask: 0.114 @ 61595.2\n", "bid: 3.526 @ 61594.1\n", "bid: 0.016 @ 61594.0\n", "bid: 0.002 @ 61593.9\n", "current_timestamp: 1723161721500000000\n", "ask: 2.575 @ 61576.6\n", "ask: 0.004 @ 61576.7\n", "ask: 0.455 @ 61577.0\n", "bid: 2.558 @ 61576.5\n", "bid: 0.002 @ 61576.0\n", "bid: 0.515 @ 61575.5\n", "current_timestamp: 1723161781500000000\n", "ask: 0.131 @ 61629.7\n", "ask: 0.005 @ 61630.1\n", "ask: 0.005 @ 61630.5\n", "bid: 5.742 @ 61629.6\n", "bid: 0.247 @ 61629.4\n", "bid: 0.034 @ 61629.3\n", "current_timestamp: 1723161841500000000\n", "ask: 0.202 @ 61621.6\n", "ask: 0.002 @ 61622.5\n", "ask: 0.003 @ 61622.6\n", "bid: 3.488 @ 61621.5\n", "bid: 0.86 @ 61620.0\n", "bid: 0.248 @ 61619.6\n", "current_timestamp: 1723161901500000000\n", "ask: 1.397 @ 61584.0\n", "ask: 0.832 @ 61585.1\n", "ask: 0.132 @ 61586.0\n", "bid: 3.307 @ 61583.9\n", "bid: 0.01 @ 61583.8\n", "bid: 0.002 @ 61582.0\n" ] } ], "source": [ "hbt = HashMapMarketDepthBacktest([asset])\n", "\n", "print_3depth(hbt)\n", "\n", "_ = hbt.close()" ] }, { "cell_type": "markdown", "id": "b27f7386", "metadata": {}, "source": [ "## 注文の提出" ] }, { "cell_type": "code", "execution_count": 9, "id": "d5e29cfe", "metadata": {}, "outputs": [], "source": [ "from hftbacktest import LIMIT, GTC, NONE, NEW, FILLED, CANCELED, EXPIRED\n", "\n", "@njit\n", "def print_orders(hbt):\n", " # You can access open orders and also closed orders via hbt.orders.\n", " # Gets the OrderDict for the first asset.\n", " orders = hbt.orders(0)\n", " \n", " # hbt.orders is a dictionary, but be aware that it does not support all dict methods, and its keys are order_id (int).\n", " order_values = orders.values()\n", " while 
order_values.has_next():\n", " order = order_values.get()\n", " \n", " order_status = ''\n", " if order.status == NONE:\n", " order_status = 'NONE' # Exchange hasn't received an order yet.\n", " elif order.status == NEW:\n", " order_status = 'NEW'\n", " elif order.status == FILLED:\n", " order_status = 'FILLED'\n", " elif order.status == CANCELED:\n", " order_status = 'CANCELED'\n", " elif order.status == EXPIRED:\n", " order_status = 'EXPIRED' \n", " \n", " order_req = ''\n", " if order.req == NONE:\n", " order_req = 'NONE'\n", " elif order.req == NEW:\n", " order_req = 'NEW'\n", " elif order.req == CANCELED:\n", " order_req = 'CANCEL'\n", " \n", " print(\n", " 'current_timestamp:', hbt.current_timestamp, \n", " ', order_id:', order.order_id,\n", " ', order_price:', np.round(order.price, 1),\n", " ', order_qty:', order.qty,\n", " ', order_status:', order_status,\n", " ', order_req:', order_req\n", " )\n", "\n", "@njit\n", "def submit_order(hbt):\n", " is_order_submitted = False\n", " while hbt.elapse(30 * 1e9) == 0:\n", " # Prints open orders.\n", " print_orders(hbt)\n", "\n", " depth = hbt.depth(0)\n", " \n", " if not is_order_submitted:\n", " # Submits a buy order at 300 ticks below the best bid for the first asset.\n", " order_id = 1\n", " order_price = depth.best_bid - 300 * depth.tick_size\n", " order_qty = 1\n", " time_in_force = GTC # Good 'till cancel\n", " order_type = LIMIT\n", " hbt.submit_buy_order(0, order_id, order_price, order_qty, time_in_force, order_type, False)\n", " is_order_submitted = True\n", " return True" ] }, { "cell_type": "code", "execution_count": 10, "id": "ade32153", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "current_timestamp: 1723161661500000000 , order_id: 1 , order_price: 61643.8 , order_qty: 1.0 , order_status: FILLED , order_req: NONE\n", "current_timestamp: 1723161691500000000 , order_id: 1 , order_price: 61643.8 , order_qty: 1.0 , order_status: FILLED , order_req: NONE\n", "current_timestamp: 1723161721500000000 , order_id: 1 , order_price: 61643.8 , order_qty: 1.0 , order_status: FILLED , order_req: NONE\n", "current_timestamp: 1723161751500000000 , order_id: 1 , order_price: 61643.8 , order_qty: 1.0 , order_status: FILLED , order_req: NONE\n", "current_timestamp: 1723161781500000000 , order_id: 1 , order_price: 61643.8 , order_qty: 1.0 , order_status: FILLED , order_req: NONE\n", "current_timestamp: 1723161811500000000 , order_id: 1 , order_price: 61643.8 , order_qty: 1.0 , order_status: FILLED , order_req: NONE\n", "current_timestamp: 1723161841500000000 , order_id: 1 , order_price: 61643.8 , order_qty: 1.0 , order_status: FILLED , order_req: NONE\n", "current_timestamp: 1723161871500000000 , order_id: 1 , order_price: 61643.8 , order_qty: 1.0 , order_status: FILLED , order_req: NONE\n", "current_timestamp: 1723161901500000000 , order_id: 1 , order_price: 61643.8 , order_qty: 1.0 , order_status: FILLED , order_req: NONE\n" ] } ], "source": [ "hbt = HashMapMarketDepthBacktest([asset])\n", "\n", "submit_order(hbt)\n", "\n", "_ = hbt.close()" ] }, { "cell_type": "markdown", "id": "edcdf5c0", "metadata": {}, "source": [ "## 非アクティブな注文のクリア (FILLED, CANCELED, EXPIRED)" ] }, { "cell_type": "code", "execution_count": 11, "id": "f163eef1", "metadata": {}, "outputs": [], "source": [ "from hftbacktest import GTC\n", "\n", "@njit\n", "def clear_inactive_orders(hbt):\n", " is_order_submitted = False\n", " while hbt.elapse(30 * 1e9) == 0:\n", " print_orders(hbt)\n", " \n", " # Removes inactive(FILLED, CANCELED, EXPIRED) 
orders from hbt.orders for the first asset.\n", " hbt.clear_inactive_orders(0)\n", "\n", " depth = hbt.depth(0)\n", " \n", " if not is_order_submitted:\n", " order_id = 1\n", " order_price = depth.best_bid - 300 * depth.tick_size\n", " order_qty = 1\n", " time_in_force = GTC\n", " order_type = LIMIT\n", " hbt.submit_buy_order(0, order_id, order_price, order_qty, time_in_force, order_type, False)\n", " is_order_submitted = True\n", " return True" ] }, { "cell_type": "code", "execution_count": 12, "id": "821f9e56-5611-4e58-8b88-611fd277144f", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "current_timestamp: 1723161661500000000 , order_id: 1 , order_price: 61643.8 , order_qty: 1.0 , order_status: FILLED , order_req: NONE\n" ] } ], "source": [ "hbt = HashMapMarketDepthBacktest([asset])\n", "\n", "clear_inactive_orders(hbt)\n", "\n", "_ = hbt.close()" ] }, { "cell_type": "markdown", "id": "0d6d85dd", "metadata": {}, "source": [ "## 注文ステータスの監視 - 注文遅延による保留中" ] }, { "cell_type": "code", "execution_count": 13, "id": "859bca22", "metadata": {}, "outputs": [], "source": [ "from hftbacktest import GTC\n", "\n", "@njit\n", "def watch_pending(hbt):\n", " is_order_submitted = False\n", " # Elapses 0.01-sec every iteration.\n", " while hbt.elapse(0.01 * 1e9) == 0:\n", " print_orders(hbt)\n", " \n", " hbt.clear_inactive_orders(0)\n", "\n", " depth = hbt.depth(0)\n", " \n", " if not is_order_submitted:\n", " order_id = 1\n", " order_price = depth.best_bid - 300 * depth.tick_size\n", " order_qty = 1\n", " time_in_force = GTC\n", " order_type = LIMIT\n", " hbt.submit_buy_order(0, order_id, order_price, order_qty, time_in_force, order_type, False)\n", " is_order_submitted = True\n", " \n", " # Prevents too many prints\n", " orders = hbt.orders(0)\n", " order = orders.get(order_id)\n", " if order.status == NEW:\n", " return False\n", " return True" ] }, { "cell_type": "markdown", "id": "8ea8e3c4-b972-4472-ab33-2f79eba54035", "metadata": {}, "source": [ "`order_status`は受け入れメッセージが受信されるまで`None`です。" ] }, { "cell_type": "code", "execution_count": 14, "id": "a0c2e47e", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "current_timestamp: 1723161601520000000 , order_id: 1 , order_price: 61629.7 , order_qty: 1.0 , order_status: NONE , order_req: NEW\n", "current_timestamp: 1723161601530000000 , order_id: 1 , order_price: 61629.7 , order_qty: 1.0 , order_status: NEW , order_req: NONE\n" ] } ], "source": [ "hbt = HashMapMarketDepthBacktest([asset])\n", "\n", "watch_pending(hbt)\n", "\n", "_ = hbt.close()" ] }, { "cell_type": "markdown", "id": "00c1f5ed", "metadata": {}, "source": [ "## 注文応答の待機" ] }, { "cell_type": "code", "execution_count": 15, "id": "b2e43564", "metadata": {}, "outputs": [], "source": [ "from hftbacktest import GTC\n", "\n", "@njit\n", "def wait_for_order_response(hbt):\n", " order_id = 0\n", " is_order_submitted = False\n", " while hbt.elapse(0.01 * 1e9) == 0:\n", " print_orders(hbt)\n", " \n", " hbt.clear_inactive_orders(0)\n", " \n", " # Prevents too many prints\n", " orders = hbt.orders(0)\n", " if order_id in orders:\n", " if orders.get(order_id).status == NEW:\n", " return False\n", "\n", " depth = hbt.depth(0)\n", " \n", " if not is_order_submitted:\n", " order_id = 1\n", " order_price = depth.best_bid\n", " order_qty = 1\n", " time_in_force = GTC\n", " order_type = LIMIT\n", " hbt.submit_buy_order(0, order_id, order_price, order_qty, time_in_force, order_type, False)\n", " # Waits for the order response for a given order id 
for the first asset.\n", "            print('an order is submitted at', hbt.current_timestamp)\n", "\n", "            # Timeout is set 1-second.\n", "            hbt.wait_order_response(0, order_id, 1 * 1e9)\n", "            print('an order response is received at', hbt.current_timestamp)\n", "            is_order_submitted = True\n", "    return True" ] }, { "cell_type": "markdown", "id": "41d43022-fcef-4d73-a8fd-4d94ad1b8e29", "metadata": {}, "source": [ "Since the `ConstantLatency` model is used, the round-trip latency is exactly 20ms. Ideally, the best approach is to use historical order latency data collected from the live market. If such data is not available, however, one way to start is to use order latencies generated artificially from the feed latency. This is explained in more detail in the next example." ] }, { "cell_type": "code", "execution_count": 16, "id": "fcfb211b", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "an order is submitted at 1723161601510000000\n", "an order response is received at 1723161601530000000\n", "current_timestamp: 1723161601540000000 , order_id: 1 , order_price: 61659.7 , order_qty: 1.0 , order_status: NEW , order_req: NONE\n" ] } ], "source": [ "hbt = HashMapMarketDepthBacktest([asset])\n", "\n", "wait_for_order_response(hbt)\n", "\n", "_ = hbt.close()" ] }, { "cell_type": "markdown", "id": "b8c27280", "metadata": {}, "source": [ "## Printing the position, balance, fee, and equity" ] }, { "cell_type": "code", "execution_count": 17, "id": "c38365d8", "metadata": {}, "outputs": [], "source": [ "@njit\n", "def position(hbt):\n", "    is_order_submitted = False\n", "    while hbt.elapse(60 * 1e9) == 0:\n", "        print_orders(hbt)\n", "\n", "        hbt.clear_inactive_orders(0)\n", "\n", "        # Prints position\n", "        print(\n", "            'current_timestamp:', hbt.current_timestamp,\n", "            ', position:', hbt.position(0),\n", "            ', balance:', hbt.state_values(0).balance,\n", "            ', fee:', hbt.state_values(0).fee\n", "        )\n", "\n", "        depth = hbt.depth(0)\n", "\n", "        if not is_order_submitted:\n", "            order_id = 1\n", "            order_price = depth.best_bid\n", "            order_qty = 1\n", "            time_in_force = GTC\n", "            order_type = LIMIT\n", "            hbt.submit_buy_order(0, order_id, order_price, order_qty, time_in_force, order_type, False)\n", "\n", "            # Timeout is set 1-second.\n", "            hbt.wait_order_response(0, order_id, 1e9)\n", "            is_order_submitted = True\n", "    return True" ] }, { "cell_type": "code", "execution_count": 18, "id": "a9cf70b2", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "current_timestamp: 1723161661500000000 , position: 0.0 , balance: 0.0 , fee: 0.0\n", "current_timestamp: 1723161721520000000 , order_id: 1 , order_price: 61594.1 , order_qty: 1.0 , order_status: FILLED , order_req: NONE\n", "current_timestamp: 1723161721520000000 , position: 1.0 , balance: -61594.100000000006 , fee: 12.318820000000002\n", "current_timestamp: 1723161781520000000 , position: 1.0 , balance: -61594.100000000006 , fee: 12.318820000000002\n", "current_timestamp: 1723161841520000000 , position: 1.0 , balance: -61594.100000000006 , fee: 12.318820000000002\n", "current_timestamp: 1723161901520000000 , position: 1.0 , balance: -61594.100000000006 , fee: 12.318820000000002\n" ] } ], "source": [ "hbt = HashMapMarketDepthBacktest([asset])\n", "\n", "position(hbt)\n", "\n", "_ = hbt.close()" ] }, { "cell_type": "markdown", "id": "7d6c739c", "metadata": {}, "source": [ "## Canceling an open order" ] }, { "cell_type": "code", "execution_count": 19, "id": "502bf0f6", "metadata": {}, "outputs": [], "source": [ "@njit\n", "def submit_and_cancel_order(hbt):\n", "    is_order_submitted = False\n", "    while hbt.elapse(0.1 * 1e9) == 0:\n", "        print_orders(hbt)\n", "\n", "        hbt.clear_inactive_orders(0)\n", "\n", "        # Cancels if there is an 
open order\n", " orders = hbt.orders(0)\n", " order_values = orders.values()\n", " while order_values.has_next():\n", " order = order_values.get()\n", " \n", " # an order is only cancellable if order status is NEW.\n", " # cancel request is negated if the order is already filled or filled before cancel request is processed.\n", " if order.cancellable:\n", " hbt.cancel(0, order.order_id, False)\n", " # You can see status still NEW and see req CANCEL.\n", " print_orders(hbt)\n", " # cancels request also has order entry/response latencies the same as submitting.\n", " hbt.wait_order_response(0, order.order_id, 1e9)\n", " \n", " if not is_order_submitted:\n", " depth = hbt.depth(0)\n", " \n", " order_id = 1\n", " order_price = depth.best_bid - 100 * depth.tick_size\n", " order_qty = 1\n", " time_in_force = GTC\n", " order_type = LIMIT\n", " hbt.submit_buy_order(0, order_id, order_price, order_qty, time_in_force, order_type, False)\n", " \n", " # Timeout is set 1-second.\n", " hbt.wait_order_response(0, order_id, 1e9)\n", " is_order_submitted = True\n", " else:\n", " if len(hbt.orders(0)) == 0:\n", " return False\n", " return True" ] }, { "cell_type": "code", "execution_count": 20, "id": "638c3012", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "current_timestamp: 1723161601720000000 , order_id: 1 , order_price: 61649.7 , order_qty: 1.0 , order_status: NEW , order_req: NONE\n", "current_timestamp: 1723161601720000000 , order_id: 1 , order_price: 61649.7 , order_qty: 1.0 , order_status: NEW , order_req: CANCEL\n", "current_timestamp: 1723161601840000000 , order_id: 1 , order_price: 61649.7 , order_qty: 1.0 , order_status: CANCELED , order_req: NONE\n" ] } ], "source": [ "hbt = HashMapMarketDepthBacktest([asset])\n", "\n", "submit_and_cancel_order(hbt)\n", "\n", "_ = hbt.close()" ] }, { "cell_type": "markdown", "id": "6a60476c", "metadata": {}, "source": [ "## 成行注文" ] }, { "cell_type": "code", "execution_count": 21, "id": "9e87515b", "metadata": {}, "outputs": [], "source": [ "from hftbacktest import MARKET\n", "\n", "@njit\n", "def print_orders_exec_price(hbt):\n", " orders = hbt.orders(0)\n", " order_values = orders.values()\n", " while order_values.has_next():\n", " order = order_values.get()\n", " \n", " order_status = ''\n", " if order.status == NONE:\n", " order_status = 'NONE'\n", " elif order.status == NEW:\n", " order_status = 'NEW'\n", " elif order.status == FILLED:\n", " order_status = 'FILLED'\n", " elif order.status == CANCELED:\n", " order_status = 'CANCELED'\n", " elif order.status == EXPIRED:\n", " order_status = 'EXPIRED' \n", " \n", " order_req = ''\n", " if order.req == NONE:\n", " order_req = 'NONE'\n", " elif order.req == NEW:\n", " order_req = 'NEW'\n", " elif order.req == CANCELED:\n", " order_req = 'CANCEL'\n", " \n", " print(\n", " 'current_timestamp:', hbt.current_timestamp, \n", " ', order_id:', order.order_id,\n", " ', order_price:', np.round(order.price, 1),\n", " ', order_qty:', order.qty,\n", " ', order_status:', order_status,\n", " ', exec_price:', np.round(order.exec_price, 1)\n", " )\n", " \n", "@njit\n", "def market_order(hbt):\n", " is_order_submitted = False\n", " while hbt.elapse(60 * 1e9) == 0:\n", " print_orders(hbt)\n", " \n", " hbt.clear_inactive_orders(0)\n", "\n", " state_values = hbt.state_values(0)\n", " \n", " print(\n", " 'current_timestamp:', hbt.current_timestamp,\n", " ', position:', hbt.position(0),\n", " ', balance:', state_values.balance,\n", " ', fee:', state_values.fee\n", " )\n", " \n", " if not 
is_order_submitted:\n", " depth = hbt.depth(0)\n", " \n", " order_id = 1\n", " # Sets an arbitrary price, which does not affect MARKET orders.\n", " order_price = depth.best_bid\n", " order_qty = 1\n", " time_in_force = GTC\n", " order_type = MARKET\n", " hbt.submit_sell_order(0, order_id, order_price, order_qty, time_in_force, order_type, False)\n", " hbt.wait_order_response(0, order_id, 1e9)\n", " # You can see the order immediately filled.\n", " # Also you can see the order executed at the best bid which is different from what it was submitted at.\n", " print('best_bid:', depth.best_bid)\n", " print_orders_exec_price(hbt) \n", " is_order_submitted = True\n", " return True" ] }, { "cell_type": "code", "execution_count": 22, "id": "0b6003d9", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "current_timestamp: 1723161661500000000 , position: 0.0 , balance: 0.0 , fee: 0.0\n", "best_bid: 61594.100000000006\n", "current_timestamp: 1723161661520000000 , order_id: 1 , order_price: 61594.1 , order_qty: 1.0 , order_status: FILLED , exec_price: 61594.1\n", "current_timestamp: 1723161721520000000 , order_id: 1 , order_price: 61594.1 , order_qty: 1.0 , order_status: FILLED , order_req: NONE\n", "current_timestamp: 1723161721520000000 , position: -1.0 , balance: 61594.100000000006 , fee: 43.11587\n", "current_timestamp: 1723161781520000000 , position: -1.0 , balance: 61594.100000000006 , fee: 43.11587\n", "current_timestamp: 1723161841520000000 , position: -1.0 , balance: 61594.100000000006 , fee: 43.11587\n", "current_timestamp: 1723161901520000000 , position: -1.0 , balance: 61594.100000000006 , fee: 43.11587\n" ] } ], "source": [ "hbt = HashMapMarketDepthBacktest([asset])\n", "\n", "market_order(hbt)\n", "\n", "_ = hbt.close()" ] }, { "cell_type": "markdown", "id": "980ebc4d", "metadata": {}, "source": [ "## GTX, Post-Only注文" ] }, { "cell_type": "code", "execution_count": 23, "id": "1b0255bb", "metadata": {}, "outputs": [], "source": [ "from hftbacktest import GTX\n", "\n", "@njit\n", "def submit_gtx(hbt):\n", " is_order_submitted = False\n", " while hbt.elapse(60 * 1e9) == 0:\n", " print_orders(hbt)\n", " \n", " hbt.clear_inactive_orders(0)\n", " \n", " state_values = hbt.state_values(0)\n", " \n", " print(\n", " 'current_timestamp:', hbt.current_timestamp,\n", " ', position:', hbt.position(0),\n", " ', balance:', state_values.balance,\n", " ', fee:', state_values.fee\n", " )\n", " \n", " if not is_order_submitted:\n", " depth = hbt.depth(0)\n", " \n", " order_id = 1\n", " # Sets a deep price in the opposite side and it will be rejected by GTX.\n", " order_price = depth.best_bid - 100 * depth.tick_size\n", " order_qty = 1\n", " time_in_force = GTX\n", " order_type = LIMIT\n", " hbt.submit_sell_order(0, order_id, order_price, order_qty, time_in_force, order_type, False)\n", " hbt.wait_order_response(0, order_id, 1e9)\n", " is_order_submitted = True\n", " return True" ] }, { "cell_type": "code", "execution_count": 24, "id": "f1030532", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "current_timestamp: 1723161661500000000 , position: 0.0 , balance: 0.0 , fee: 0.0\n", "current_timestamp: 1723161721520000000 , order_id: 1 , order_price: 61584.1 , order_qty: 1.0 , order_status: EXPIRED , order_req: NONE\n", "current_timestamp: 1723161721520000000 , position: 0.0 , balance: 0.0 , fee: 0.0\n", "current_timestamp: 1723161781520000000 , position: 0.0 , balance: 0.0 , fee: 0.0\n", "current_timestamp: 1723161841520000000 , position: 0.0 
, balance: 0.0 , fee: 0.0\n", "current_timestamp: 1723161901520000000 , position: 0.0 , balance: 0.0 , fee: 0.0\n" ] } ], "source": [ "hbt = HashMapMarketDepthBacktest([asset])\n", "\n", "submit_gtx(hbt)\n", "\n", "_ = hbt.close()" ] }, { "cell_type": "markdown", "id": "eff830d3", "metadata": {}, "source": [ "## Plotting BBO" ] }, { "cell_type": "code", "execution_count": 25, "id": "641721c6", "metadata": {}, "outputs": [], "source": [ "@njit\n", "def plot_bbo(hbt, local_timestamp, best_bid, best_ask):\n", "    while hbt.elapse(1 * 1e9) == 0:\n", "        # Records data points\n", "        local_timestamp.append(hbt.current_timestamp)\n", "\n", "        depth = hbt.depth(0)\n", "\n", "        best_bid.append(depth.best_bid)\n", "        best_ask.append(depth.best_ask)\n", "    return True" ] },
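 { "cell_type": "markdown", "id": "bbo-plot-sketch-note", "metadata": {}, "source": [ "A minimal sketch of one way to run `plot_bbo` and draw the recorded series, assuming `matplotlib` is installed and that `hbt.current_timestamp` is an int64 nanosecond timestamp; the notebook's own plotting cell uses Bokeh via HoloViews instead." ] }, { "cell_type": "code", "execution_count": null, "id": "bbo-plot-sketch", "metadata": {}, "outputs": [], "source": [ "# Minimal sketch (assumes matplotlib is installed), not the notebook's own plotting code.\n", "import matplotlib.pyplot as plt\n", "from numba import int64, float64\n", "from numba.typed import List\n", "\n", "# numba-typed lists so the njit-compiled plot_bbo can append to them.\n", "# Assumes current_timestamp is int64 (nanoseconds); prices are float64.\n", "local_timestamp = List.empty_list(int64)\n", "best_bid = List.empty_list(float64)\n", "best_ask = List.empty_list(float64)\n", "\n", "hbt = HashMapMarketDepthBacktest([asset])\n", "plot_bbo(hbt, local_timestamp, best_bid, best_ask)\n", "_ = hbt.close()\n", "\n", "# Convert the recorded series to numpy arrays and plot price over elapsed time.\n", "ts = np.array(list(local_timestamp), dtype=np.int64)\n", "elapsed_s = (ts - ts[0]) / 1e9\n", "plt.plot(elapsed_s, np.array(list(best_bid)), label='best bid')\n", "plt.plot(elapsed_s, np.array(list(best_ask)), label='best ask')\n", "plt.xlabel('elapsed time (s)')\n", "plt.ylabel('price')\n", "plt.legend()\n", "plt.show()" ] },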
 { "cell_type": "code", "execution_count": 26, "id": "9ecfecc8", "metadata": {}, "outputs": [ { "data": { "application/vnd.holoviews_exec.v0+json": "", "text/html": [ "
start | end | SR | Sortino | Return | MaxDrawdown | DailyNumberOfTrades | DailyTradingValue | ReturnOverMDD | ReturnOverTrade | MaxPositionValue |
---|---|---|---|---|---|---|---|---|---|---|
datetime[μs] | datetime[μs] | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 | f64 |
2024-08-09 00:00:00 | 2024-08-09 00:05:00 | -624.497686 | -664.628958 | -1846.54472 | 1902.18778 | 50688.0 | 3.1236e9 | -0.970748 | -0.00017 | 553849.65 |